
Remove default optimizer, add None optimizer option #1279

Merged
7 commits merged into Lightning-AI:master on Apr 2, 2020

Conversation

@ethanwharris ethanwharris (Member) commented Mar 29, 2020

Before submitting

  • Was this discussed/approved via a GitHub issue? (not needed for typo fixes or docs improvements)
  • Did you read the contributor guideline, Pull Request section?
  • Did you make sure to update the docs?
  • Did you write any new necessary tests?
  • If you made a notable change (that affects users), did you update the CHANGELOG?

What does this PR do?

  • no longer automatically chooses an optimizer
  • runs without an optimizer and warns the user if configure_optimizers is not overridden or returns None (see the sketch below)
  • refactored the default optimizer into the init_optimizers method
  • refactored init_optimizers and configure_schedulers into their own mixin in a separate file (optimizers.py)
  • changed init_optimizers to take just the model, so that configure_optimizers is only called in one place
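
As a rough illustration of the points above, here is a minimal sketch of an init_optimizers that calls configure_optimizers in one place, warns on a None return, and lets training run without an optimizer. This is a simplified assumption of the behaviour, not the actual pytorch-lightning implementation; the class name and the handled return shapes are illustrative only.

```python
import warnings
from typing import List, Tuple

import torch


class TrainerOptimizersMixin:
    """Simplified sketch of the optimizer-setup mixin described above (not the real code)."""

    def init_optimizers(self, model) -> Tuple[List, List]:
        # configure_optimizers is now called in exactly one place
        optim_conf = model.configure_optimizers()

        if optim_conf is None:
            warnings.warn(
                "`configure_optimizers` returned None; training will run "
                "without an optimizer."
            )
            return [], []

        # a single optimizer instance
        if isinstance(optim_conf, torch.optim.Optimizer):
            return [optim_conf], []

        # only the (optimizers, schedulers) tuple case is shown here
        optimizers, lr_schedulers = optim_conf
        return list(optimizers), list(lr_schedulers)
```

Per the PR description, the real mixin also includes configure_schedulers for learning-rate scheduler handling, which is omitted from this sketch.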

PR review

Anyone in the community is free to review the PR once the tests have passed.
If we didn't discuss your PR in GitHub issues, there's a high chance it will not be merged.

Did you have fun?

Make sure you had fun coding 🙃

@ethanwharris ethanwharris requested a review from a team March 29, 2020 11:53
```diff
@@ -905,24 +904,25 @@ def configure_apex(self, amp, model, optimizers, amp_level):

         return model, optimizers

-    def configure_optimizers(self) -> Union[
+    def configure_optimizers(self) -> Optional[Union[
```
Contributor:

What if we just make the whole function optional, instead of just the args? Or is that Optional syntax doing that?

Contributor:
Or I guess if we make the method still required but with an optional return, then it maintains readability? On second thought I kind of like this better: that way it's explicit in code that you used no optimizer.

Member Author (ethanwharris):

Yeah, this does that - or at least, PyCharm isn't prompting me to implement the method :) - the Optional just means you won't get a type warning if you return None

Member Author (ethanwharris):

^^ We can do that - currently the method isn't required, but happy to change that :) Agree that this behaviour should be explicit.
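
For context on the discussion above, here is a sketch of what an explicit "no optimizer" module could look like after this change. The module name, layer sizes, and training_step are hypothetical and only illustrate that configure_optimizers may now return None.

```python
import pytorch_lightning as pl
from torch import nn


class NoOptimizerModel(pl.LightningModule):
    """Hypothetical model that explicitly opts out of having an optimizer."""

    def __init__(self):
        super().__init__()
        self.layer = nn.Linear(32, 2)

    def training_step(self, batch, batch_idx):
        x, y = batch
        return nn.functional.cross_entropy(self.layer(x), y)

    def configure_optimizers(self):
        # With this PR, returning None is allowed: the Trainer warns and runs
        # without an optimizer instead of silently falling back to a default.
        return None
```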

@codecov codecov bot commented Mar 29, 2020

Codecov Report

Merging #1279 into master will increase coverage by <1%.
The diff coverage is 89%.

@@           Coverage Diff           @@
##           master   #1279    +/-   ##
=======================================
+ Coverage      92%     92%   +<1%     
=======================================
  Files          62      62            
  Lines        3245    3173    -72     
=======================================
- Hits         2970    2905    -65     
+ Misses        275     268     -7

@Borda Borda added the feature Is an improvement or enhancement label Mar 29, 2020
@Borda Borda added this to the 0.7.2 milestone Mar 29, 2020
@Borda Borda (Member) left a comment

Can you please sync this work with #1269?

pytorch_lightning/trainer/optimizers.py (review comment resolved)
Co-Authored-By: Jirka Borovec <Borda@users.noreply.github.com>
@ethanwharris ethanwharris changed the title from Optimizer warnings to [blocked by #1269] Optimizer warnings Mar 29, 2020
@mergify mergify bot commented Mar 29, 2020

This pull request is now in conflict... :(

@mergify mergify bot requested a review from a team March 30, 2020 22:33
@Borda Borda changed the title from [blocked by #1269] Optimizer warnings to Optimizer warnings Mar 31, 2020
@pep8speaks pep8speaks commented Apr 2, 2020

Hello @ethanwharris! Thanks for updating this PR.

There are currently no PEP 8 issues detected in this Pull Request. Cheers! 🍻

Comment last updated at 2020-04-02 12:38:16 UTC

@ethanwharris ethanwharris changed the title from Optimizer warnings to Remove default optimizer, add None optimizer option Apr 2, 2020
@williamFalcon williamFalcon merged commit 28242f0 into Lightning-AI:master Apr 2, 2020
alexeykarnachev pushed a commit to alexeykarnachev/pytorch-lightning that referenced this pull request Apr 3, 2020
* Add warning when using default optimizer

* Refactor optimizer tests to test_optimizers

* Remove default optimizer, add option to use no optimizer

* Update CHANGELOG.md

* Update pytorch_lightning/trainer/optimizers.py

Co-Authored-By: Jirka Borovec <Borda@users.noreply.github.com>

* Fix style

Co-authored-by: Jirka Borovec <Borda@users.noreply.github.com>
@Borda Borda modified the milestones: v0.7., v0.7.x Apr 18, 2021